
    Are all reversible computations tidy?

    It has long been known that, to minimise the heat emitted by a deterministic computer during its operation, it is necessary to make the computation act in a logically reversible manner \cite{Lan61}. Such logically reversible operations require a number of auxiliary bits to be stored, maintaining a history of the computation and allowing the initial state to be reconstructed by running the computation in reverse. These auxiliary bits are wasteful of resources and may require energy to be dissipated before they can be reused. A simple procedure due to Bennett \cite{Ben73} allows these auxiliary bits to be "tidied", without dissipating energy, on a classical computer. All reversible classical computations can be made tidy in this way. However, this procedure depends upon a classical operation ("cloning") that cannot be generalised to quantum computers \cite{WZ82}. Quantum computations must be logically reversible, and therefore produce auxiliary qubits during their operation. We show that there are classes of quantum computation for which Bennett's procedure cannot be implemented. For some of these computations there may exist another method by which the computation can be "tidied". However, we also show that there are quantum computations for which there is no possible method for tidying the auxiliary qubits. Not all reversible quantum computations can be made "tidy". This represents a fundamental additional energy burden on quantum computations. This paper extends results in \cite{Mar01}. Comment: 7 pages, 1 figure
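
    As a toy illustration of the classical procedure described above, the following sketch mimics the compute-copy-uncompute structure of Bennett's tidying on a mock reversible computer. The function and its history register are illustrative assumptions, not taken from the paper; step 2 is the classical cloning step that the no-cloning theorem forbids in the quantum case.

        # A minimal classical mock-up of Bennett's "tidying": compute while keeping
        # a history, copy the answer, then run the computation in reverse so the
        # auxiliary (history) bits are removed without dissipation.
        # The toy function and register layout are illustrative, not from the paper.

        def forward(x):
            """Reversible step: returns (output, history); the history bits record
            enough information to invert the step."""
            return (x * 3) % 16, x                 # toy transformation; history = input

        def backward(output, history):
            """Runs the computation in reverse, consuming the history bits."""
            return history

        def bennett_tidy(x):
            output, history = forward(x)           # 1. compute, accumulating auxiliary bits
            answer = output                        # 2. copy the answer (classical cloning)
            x_restored = backward(output, history) # 3. uncompute, erasing the history
            assert x_restored == x
            return x_restored, answer              # only the input and the answer remain

        print(bennett_tidy(5))                     # -> (5, 15)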

    How statistical are quantum states?

    A novel no-go theorem is presented which sets a bound upon the extent to which '\Psi-epistemic' interpretations of quantum theory are able to explain the overlap between non-orthogonal quantum states in terms of an experimenter's ignorance of an underlying state of reality. The theorem applies to any Hilbert space of dimension greater than two. In the limit of large Hilbert spaces, no more than half of the overlap between quantum states can be accounted for. Unlike other recent no-go theorems, no additional assumptions, such as forms of locality, invasiveness, or non-contextuality, are required. Comment: 5 pages. Noise tolerant calculation removed in favour of better calculation in forthcoming article
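
    A small numerical illustration of the quantity at stake, assuming the overlap is quantified by the Born-rule probability |<phi|psi>|^2 (the paper's precise overlap measure may differ); the qutrit states below are illustrative choices.

        # Overlap between two non-orthogonal states in a d = 3 Hilbert space.
        # A psi-epistemic model tries to explain this overlap by ignorance of an
        # underlying ontic state; per the abstract, in the large-d limit at most
        # half of it can be accounted for in this way.
        import numpy as np

        psi = np.array([1, 0, 0], dtype=complex)                # qutrit state
        phi = np.array([1, 1, 1], dtype=complex) / np.sqrt(3)   # non-orthogonal state

        overlap = abs(np.vdot(psi, phi)) ** 2
        print(overlap)        # ~0.333
        print(overlap / 2)    # the large-dimension bound quoted in the abstract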

    Landauer's erasure principle in non-equilibrium systems

    In two recent papers, Maroney and Turgut separately and independently show generalisations of Landauer's erasure principle to indeterministic logical operations, as well as to logical states with variable energies and entropies. Here we show that, although Turgut's generalisation seems more powerful, in that it implies but is not implied by Maroney's, and does not rely upon initial probability distributions over logical states, it does not hold for non-equilibrium states, while Maroney's generalisation holds even in non-equilibrium. While a generalisation of Turgut's inequality to non-equilibrium seems possible, it lacks the properties that make the equilibrium inequality appealing. The non-equilibrium generalisation also no longer implies Maroney's inequality, which may still be derived independently. Furthermore, we show that Turgut's inequality gives only a necessary, but not sufficient, criterion for thermodynamic reversibility; Maroney's inequality gives the necessary and sufficient conditions. Comment: 9 pages, no figures
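
    For orientation, a schematic statement of the kind of generalised Landauer bound at issue (a sketch only; the specific inequalities of Maroney and Turgut compared in the paper take more refined forms, e.g. allowing variable energies and entropies of the logical states):

        \langle \Delta Q \rangle \;\geq\; -kT \ln 2 \,\left( H_{\mathrm{out}} - H_{\mathrm{in}} \right),

    where H_in and H_out are the Shannon entropies of the probability distribution over logical states before and after the (possibly indeterministic) operation. For deterministic erasure of one unbiased bit, H_in = 1 and H_out = 0, recovering the kT \ln 2 of Landauer's original principle.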

    The (absence of a) relationship between thermodynamic and logical reversibility

    Landauer erasure seems to provide a powerful link between thermodynamics and information processing (logical computation). The only logical operations that require a generation of heat are logically irreversible ones, with the minimum heat generation being kT \ln 2 per bit of information lost. Nevertheless, it will be shown that logical reversibility neither implies, nor is implied by, thermodynamic reversibility. By examining thermodynamically reversible operations which are logically irreversible, it is possible to show that information and entropy, while having the same form, are conceptually different. Comment: 19 pages, 5 figures. Based on talk at ESF Conference on Philosophical and Foundational Issues in Statistical Physics, Utrecht, November 2003. Submitted to Studies in History and Philosophy of Modern Physics
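
    A worked evaluation of the kT \ln 2 bound quoted above, at an assumed room temperature of 300 K (the temperature is an illustrative choice, not from the paper):

        # Minimum heat per bit of information lost, Q >= k T ln 2.
        import math

        k = 1.380649e-23            # Boltzmann constant, J/K
        T = 300.0                   # assumed temperature, K
        print(k * T * math.log(2))  # ~2.87e-21 J per bit erased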

    The Physical Basis of the Gibbs-von Neumann entropy

    We develop the argument that the Gibbs-von Neumann entropy is the appropriate statistical mechanical generalisation of the thermodynamic entropy, for macroscopic and microscopic systems, whether in thermal equilibrium or not, as a consequence of Hamiltonian dynamics. The mathematical treatment utilises well-known results [Gib02, Tol38, Weh78, Par89], but, most importantly, incorporates a variety of arguments on the phenomenological properties of thermal states [Szi25, TQ63, HK65, GB91] and of statistical distributions [HG76, PW78, Len78]. This enables the identification of the canonical distribution as the unique representation of thermal states, without approximation or presupposing the existence of an entropy function. The Gibbs-von Neumann entropy is then derived from arguments based solely on the addition of probabilities to Hamiltonian dynamics. Comment: 42 pages, no figures (3rd version: substantial revision and simplification of central argument incorporating adiabatic availability and passive distributions)
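
    As a concrete anchor for the quantity being discussed, the sketch below evaluates the Gibbs-von Neumann entropy S = -k Tr(rho ln rho) for a canonical (thermal) state of a toy two-level system; the Hamiltonian and temperature are illustrative assumptions, not taken from the paper.

        # Gibbs-von Neumann entropy of a canonical density matrix for a toy
        # two-level system at an assumed temperature.
        import numpy as np

        k = 1.380649e-23                      # Boltzmann constant, J/K
        T = 300.0                             # assumed temperature, K
        E = np.array([0.0, 1.0e-21])          # toy energy levels, J

        p = np.exp(-E / (k * T))
        p /= p.sum()                          # canonical (Gibbs) probabilities
        rho = np.diag(p)                      # density matrix, diagonal in energy basis

        eigs = np.linalg.eigvalsh(rho)
        S = -k * np.sum(eigs * np.log(eigs))  # -k Tr(rho ln rho)
        print(S)                              # ~9.5e-24 J/K for these toy values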

    Constraints on Macroscopic Realism Without Assuming Non-Invasive Measurability

    Macroscopic realism is the thesis that macroscopically observable properties must always have definite values. The idea was introduced by Leggett and Garg (1985), who wished to show a conflict with the predictions of quantum theory. However, their analysis required not just the assumption of macroscopic realism per se, but also that the observable properties could be measured non-invasively. In recent years there has been increasing interest in experimental tests of the violation of the Leggett-Garg inequality, but it has remained a matter of controversy whether this second assumption is a reasonable requirement for a macroscopic realist view of quantum theory. In a recent critical assessment, Maroney and Timpson (2017) identified three different categories of macroscopic realism, and argued that only the simplest category could be ruled out by Leggett-Garg inequality violations. Allen, Maroney, and Gogioso (2016) then showed that the second of these approaches was also incompatible with quantum theory in Hilbert spaces of dimension 4 or higher. However, we show that the distinction introduced by Maroney and Timpson between the second and third approaches is not noise-tolerant, so, unfortunately, Allen's result, as given, is not directly empirically testable. In this paper we replace Maroney and Timpson's three categories with a parameterization of macroscopic realist models, which can be related to experimental observations in a noise-tolerant way and which recovers the original definitions in the noise-free limit. We show how this parameterization can be used to experimentally rule out classes of macroscopic realism in Hilbert spaces of dimension 3 or higher, including the category tested by the Leggett-Garg inequality, without any use of the non-invasive measurability assumption. Comment: 20 pages, 10 figures
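
    For reference, the simplest three-time Leggett-Garg inequality mentioned above reads K = C12 + C23 - C13 <= 1 under macroscopic realism plus non-invasive measurability, where Cij is the two-time correlator of a dichotomic (+/-1) observable. The sketch below shows its violation for a standard textbook two-level precession model; the model and timings are illustrative choices, not taken from the paper.

        # Quantum two-time correlators for a precessing two-level system measured
        # at three equally spaced times: C_ij = cos(omega * (t_j - t_i)).
        import numpy as np

        omega_tau = np.pi / 3              # spacing chosen to maximise the violation
        C12 = C23 = np.cos(omega_tau)
        C13 = np.cos(2 * omega_tau)
        K = C12 + C23 - C13
        print(K)                           # 1.5 > 1: the inequality is violated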

    Consistent Histories and the Bohm Approach

    In a recent paper Griffiths claims that the consistent histories interpretation of quantum mechanics gives rise to results that contradict those obtained from the Bohm interpretation. This is in spite of the fact that both claim to provide a realist interpretation of the formalism without the need to add any new mathematical content, and both always produce exactly the same probability predictions for the outcomes of experiments. In contrasting the two approaches, Griffiths argues that the consistent histories interpretation provides a more physically reasonable account of quantum phenomena. We examine this claim and show that the consistent histories approach is not without its own difficulties. Comment: 8 pages, 3 figures

    Maximally epistemic interpretations of the quantum state and contextuality

    We examine the relationship between quantum contextuality (in both the standard Kochen-Specker sense and in the generalised sense proposed by Spekkens) and models of quantum theory in which the quantum state is maximally epistemic. We find that preparation noncontextual models must be maximally epistemic, and these in turn must be Kochen-Specker noncontextual. This implies that the Kochen-Specker theorem is sufficient to establish both the impossibility of maximally epistemic models and the impossibility of preparation noncontextual models. The implication from preparation noncontextuality to maximal epistemicity then also yields a proof of Bell's theorem from an EPR-like argument. Comment: v1: 4 pages, revTeX4.1, some overlap with arXiv:1207.7192. v2: Changes in response to referees, including revised proof of theorem 1, more rigorous discussion of measure-theoretic assumptions, and extra introductory material
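
    A sketch of one standard formalisation of "maximally epistemic", assuming it matches the usage here: in an ontological model where preparing a quantum state \psi samples an ontic state \lambda from a distribution \mu_\psi(\lambda), the model is maximally epistemic when, for all pairs of states,

        \int_{\Lambda_\phi} \mu_\psi(\lambda)\, d\lambda \;=\; |\langle \phi | \psi \rangle|^2,

    where \Lambda_\phi is the support of \mu_\phi; that is, the overlap of the epistemic distributions accounts for the entire quantum overlap.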
